Beating the Multiplicative Weights Update Algorithm

Authors

  • Abhinav Aggarwal
  • José Abel Castellanos
  • Diksha Gupta
Abstract

Multiplicative weights update (MWU) algorithms have been used extensively in designing iterative algorithms for many computational tasks. The core idea is to maintain a distribution over a set of experts and update this distribution in an online fashion based on the parameters of the underlying optimization problem. In this report, we study the behavior of a special MWU algorithm used for generating a global coin flip in the presence of an adversary that tampers with the experts’ advice. Specifically, we focus our attention on two adversarial strategies: (1) non-adaptive, in which the adversary chooses a fixed set of experts a priori and corrupts their advice in each round; and (2) adaptive, in which this set is chosen as the rounds of the algorithm progress. We formulate both adversarial strategies as greedy attempts to maximize the share of the corrupted experts in the final weighted advice the MWU algorithm computes, and we provide the underlying optimization problem that must be solved to achieve this goal. We provide empirical results showing that, in the presence of either adversary, the MWU algorithm takes O(n) rounds in expectation to produce the desired output. This result compares well with the current state of the art of O(n) for the general Byzantine consensus problem. Finally, we briefly discuss the extension of these adversarial strategies to a general MWU algorithm and provide an outline of the framework in that setting.
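
As a concrete illustration of the setting above, here is a minimal Python sketch of an MWU-style global coin flip facing a non-adaptive adversary that fixes its corrupted set up front and always pushes that set's advice toward +1. The parameters (n_experts, n_corrupted, eta, threshold) and the stopping rule are illustrative assumptions, not the exact protocol or adversarial optimization analyzed in the report.

    import random

    def mwu_coin_flip(n_experts=32, n_corrupted=8, eta=0.1, threshold=0.5, max_rounds=10000):
        # One weight per expert; the normalized weights are the maintained distribution.
        weights = [1.0] * n_experts
        corrupted = set(range(n_corrupted))  # fixed a priori: the non-adaptive adversary
        for t in range(1, max_rounds + 1):
            # Honest experts flip fair coins; corrupted experts always report +1.
            advice = [1 if i in corrupted else random.choice([-1, 1]) for i in range(n_experts)]
            total = sum(weights)
            aggregate = sum(w * a for w, a in zip(weights, advice)) / total
            if abs(aggregate) >= threshold:
                # The weighted advice is decisive enough to output a global coin flip.
                return (1 if aggregate > 0 else -1), t
            # Multiplicatively penalize experts that disagreed with the tentative majority.
            majority = 1 if aggregate >= 0 else -1
            weights = [w * (1 - eta) if a != majority else w
                       for w, a in zip(weights, advice)]
        return 0, max_rounds  # no decision within the round budget

    outcome, rounds = mwu_coin_flip()
    print("decided %+d after %d rounds" % (outcome, rounds))

With these defaults the loop typically halts once the corrupted experts' combined weight pushes the aggregate past the threshold, which mirrors the share-maximizing objective attributed to the greedy adversary above.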

Similar articles

The Multiplicative Weights Update Method: a Meta-Algorithm and Applications

Algorithms in varied fields use the idea of maintaining a distribution over a certain set and use the multiplicative update rule to iteratively change these weights. Their analyses are usually very similar and rely on an exponential potential function. We present a simple meta-algorithm that unifies these disparate algorithms and derives them as simple instantiations of the meta-algorithm.

The Multiplicative Weights Update Method: A Meta Algorithm and its Applications

Algorithms in varied fields use the idea of maintaining a distribution over a certain set and use the multiplicative update rule to iteratively change these weights. Their analyses are usually very similar and rely on an exponential potential function. In this survey we present a simple meta-algorithm that unifies these disparate algorithms and derives them as simple instantiations of the meta-algorithm...

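The two entries above describe the same meta-algorithm; the sketch below shows its textbook form under the standard assumption of per-expert costs in [-1, 1], with the multiplicative factor (1 - eta * cost). The cost oracle is a caller-supplied placeholder, not anything prescribed by the surveys.

    from typing import Callable, List

    def mwu_meta(n: int, rounds: int, eta: float,
                 costs: Callable[[List[float], int], List[float]]) -> List[float]:
        # Maintain one weight per expert; costs(p, t) returns per-expert costs in [-1, 1].
        w = [1.0] * n
        for t in range(rounds):
            total = sum(w)
            p = [wi / total for wi in w]        # current distribution over the experts
            m = costs(p, t)                     # environment (or adversary) reveals costs
            w = [wi * (1 - eta * mi) for wi, mi in zip(w, m)]  # multiplicative update rule
        total = sum(w)
        return [wi / total for wi in w]

    # Toy run: expert 0 always has the lowest cost, so the distribution concentrates on it.
    print(mwu_meta(n=5, rounds=500, eta=0.1,
                   costs=lambda p, t: [0.0, 0.5, 0.5, 0.5, 0.5]))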

A multiplicative weights update algorithm for MINLP

We discuss an application of the well-known Multiplicative Weights Update (MWU) algorithm to non-convex and mixed-integer nonlinear programming. We present applications to: (a) the distance geometry problem, which arises in the positioning of mobile sensors and in protein conformation; (b) a hydro unit commitment problem arising in the energy industry; and (c) a class of Markowitz’ portfolio selection...

On Sex, Evolution, and the Multiplicative Weights Update Algorithm

We consider a recent innovative theory by Chastain et al. on the role of sex in evolution [10]. In short, the theory suggests that the evolutionary process of gene recombination implements the celebrated multiplicative weights update algorithm (MWUA). They prove that the population dynamics induced by sexual reproduction can be precisely modeled by genes that use MWUA as their learning strategy...

Learning Linear Functions with Quadratic and Linear Multiplicative Updates

We analyze variations of multiplicative updates for learning linear functions online. These can be described as substituting exponentiation in the Exponentiated Gradient (EG) algorithm with quadratic and linear functions. Both kinds of updates substitute exponentiation with simpler operations and reduce dependence on the parameter that specifies the sum of the weights during learning. In particular...

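A rough sketch of the substitution this entry describes: the Exponentiated Gradient factor exp(-eta * g) is swapped for its quadratic or linear Taylor approximation. The loss, normalization, and parameterization used in the paper may differ; the function and parameter names here are illustrative only.

    import math

    def update_factor(g, eta, kind="exp"):
        # Multiplicative factor applied to a weight whose coordinate has gradient g.
        x = -eta * g
        if kind == "exp":        # standard Exponentiated Gradient factor
            return math.exp(x)
        if kind == "quadratic":  # second-order approximation of exp(x)
            return 1.0 + x + 0.5 * x * x
        if kind == "linear":     # first-order approximation of exp(x)
            return 1.0 + x
        raise ValueError(kind)

    def eg_step(weights, grads, eta=0.1, kind="exp"):
        new_w = [w * update_factor(g, eta, kind) for w, g in zip(weights, grads)]
        total = sum(new_w)
        return [w / total for w in new_w]   # renormalize so the weights sum to 1

    print(eg_step([0.25, 0.25, 0.25, 0.25], [0.8, -0.2, 0.1, 0.3], kind="quadratic"))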

Journal:
  • CoRR

Volume: abs/1708.04668

Publication year: 2017